Search results for "Music Information Retrieval"

Showing 10 of 21 documents

Semantic structures of timbre emerging from social and acoustic descriptions of music

2011

The perceptual attributes of timbre have inspired a considerable amount of multidisciplinary research, but because of the complexity of the phenomena, the approach has traditionally been confined to laboratory conditions, much to the detriment of its ecological validity. In this study, we present a purely bottom-up approach for mapping the concepts that emerge from sound qualities. A social media service ( http://www.last.fm ) is used to obtain a wide sample of verbal descriptions of music (in the form of tags) that go beyond the commonly studied concept of genre, and from this the underlying semantic structure of the sample is extracted. The structure thereby obtained is then evaluated th…
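
As an illustration of the kind of vector-based semantic analysis this abstract describes, the sketch below applies latent semantic analysis to a toy tag-by-track matrix. The tags, counts, and two-dimensional latent space are invented for the example; this is not the paper's corpus or pipeline.

```python
# Minimal sketch of vector-based semantic analysis over social tags.
# Hypothetical toy data; the paper's actual last.fm corpus and
# preprocessing are not reproduced here.
import numpy as np

# Rows: tags, columns: tracks; entries: how often a tag was applied.
tags = ["mellow", "warm", "harsh", "bright", "dark"]
tag_track = np.array([
    [3, 0, 1, 0],
    [2, 1, 0, 0],
    [0, 4, 0, 2],
    [0, 1, 3, 0],
    [1, 0, 0, 3],
], dtype=float)

# TF-IDF-style weighting, then truncated SVD (latent semantic analysis).
idf = np.log(tag_track.shape[1] / np.count_nonzero(tag_track, axis=1))
weighted = tag_track * idf[:, None]
U, s, Vt = np.linalg.svd(weighted, full_matrices=False)
k = 2                               # number of latent semantic dimensions
tag_space = U[:, :k] * s[:k]        # tag coordinates in the latent space

# Cosine similarity between tags reveals the emergent semantic structure.
unit = tag_space / np.linalg.norm(tag_space, axis=1, keepdims=True)
sim = unit @ unit.T
for i, t in enumerate(tags):
    nearest = tags[np.argsort(sim[i])[-2]]   # closest other tag
    print(f"{t:>7} ~ {nearest}")
```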

Keywords: acoustics and ultrasonics; computer science; ecological validity; music information retrieval; speech recognition; social media; timbre; similarity (psychology); electrical and electronic engineering; music psychology; natural language processing; vector-based semantic analysis; acoustic features; artificial intelligence
Published in: EURASIP Journal on Audio, Speech, and Music Processing

Capturing the musical brain with Lasso: Dynamic decoding of musical features from fMRI data.

2013

We investigated neural correlates of musical feature processing with a decoding approach. To this end, we used a method that combines computational extraction of musical features with regularized multiple regression (LASSO). Optimal model parameters were determined by maximizing the decoding accuracy using a leave-one-out cross-validation scheme. The method was applied to functional magnetic resonance imaging (fMRI) data that were collected using a naturalistic paradigm, in which participants' brain responses were recorded while they were continuously listening to pieces of real music. The dependent variables comprised musical feature time series that were computationally extracted from the…
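
The decoding scheme described here maps naturally onto off-the-shelf tooling. The sketch below pairs scikit-learn's Lasso with leave-one-out cross-validation on synthetic stand-in data; it shows the shape of the method, not the authors' actual pipeline.

```python
# Hedged sketch of LASSO decoding of a musical-feature time series from
# fMRI-like data, with leave-one-out CV to pick the regularization.
# Synthetic data stand in for the real voxel time courses.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.model_selection import LeaveOneOut, GridSearchCV

rng = np.random.default_rng(0)
n_scans, n_voxels = 120, 200
X = rng.standard_normal((n_scans, n_voxels))         # voxel activity
true_w = np.zeros(n_voxels)
true_w[:5] = [2.0, -1.5, 1.0, 0.8, -0.5]             # sparse ground truth
y = X @ true_w + 0.5 * rng.standard_normal(n_scans)  # musical feature

# Leave-one-out CV over the regularization strength, as in the abstract.
search = GridSearchCV(
    Lasso(max_iter=10_000),
    param_grid={"alpha": np.logspace(-3, 1, 9)},
    cv=LeaveOneOut(),
    scoring="neg_mean_squared_error",
)
search.fit(X, y)
print("best alpha:", search.best_params_["alpha"])
print("nonzero weights:", np.sum(search.best_estimator_.coef_ != 0))
```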

Keywords: cognitive neuroscience; speech recognition; auditory cortex; brain mapping; hippocampus; superior temporal gyrus; cerebellum; image processing, computer-assisted; signal processing, computer-assisted; music information retrieval; neural correlates; magnetic resonance imaging; neurology; auditory perception; psychology; functional magnetic resonance imaging; decoding methods; music
Published in: NeuroImage

A Matlab toolbox for music information retrieval

2008

We present MIRToolbox, an integrated set of functions written in Matlab, dedicated to the extraction from audio files of musical features related, among other things, to timbre, tonality, rhythm, and form. The objective is to offer an overview of the state of the art in computational approaches in the area of Music Information Retrieval (MIR). The design is based on a modular framework: the different algorithms are decomposed into stages, formalized using a minimal set of elementary mechanisms, and integrate different variants proposed by alternative approaches (including new strategies we have developed) that users can select and parametrize. These functions can accept a wide range of input objects.
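
MIRToolbox itself is Matlab code. As a rough analogue only, the sketch below extracts comparable timbre, tonality, and rhythm features in Python with librosa; the file path is a placeholder, and this is not the toolbox's API.

```python
# Not MIRToolbox: a librosa-based Python analogue of the same kind of
# modular feature extraction (timbre, tonality, rhythm).
# "song.wav" is a placeholder path.
import librosa

y, sr = librosa.load("song.wav", sr=None, mono=True)

# Timbre: MFCCs and spectral centroid.
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
centroid = librosa.feature.spectral_centroid(y=y, sr=sr)

# Tonality: chroma, a common input to key/mode estimation.
chroma = librosa.feature.chroma_cqt(y=y, sr=sr)

# Rhythm: onset strength and a global tempo estimate.
onset_env = librosa.onset.onset_strength(y=y, sr=sr)
tempo, beats = librosa.beat.beat_track(onset_envelope=onset_env, sr=sr)

print(f"tempo ~ {float(tempo):.1f} BPM, "
      f"mean centroid ~ {centroid.mean():.0f} Hz, "
      f"{mfcc.shape[1]} MFCC frames")
```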

Keywords: audio signal; information retrieval; computer science; modular design; music information retrieval; tonality; MATLAB; timbre

Tempo Induction from Music Recordings Using Ensemble Empirical Mode Decomposition Analysis

2011

Tempo and beat are among the most important features of Western music. Owing to the perceptual nature of tempo, its automatic analysis and extraction remain a difficult task for a large variety of music genres. Western music notation represents musical events using a hierarchical metrical structure that distinguishes different time scales. This hierarchy is often modeled using three levels: the tatum, the tactus, and the measure. The tatum represents the shortest durational value in music that is not just an accidental phenomenon (Bilmes 1993). The tactus period is the most perceptually prominent period, and is the period at which most humans would tap their feet in time with the music (Lerdah…
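
The paper's method builds on Ensemble Empirical Mode Decomposition. The sketch below deliberately swaps that for a simpler stand-in, autocorrelation of an onset-strength envelope, just to show what periodicity-based tactus estimation looks like on a synthetic 120 BPM impulse train.

```python
# Simplified stand-in for the paper's EEMD analysis: estimate the
# tactus by autocorrelating an onset-strength envelope. A synthetic
# impulse train at 120 BPM serves as the "recording".
import numpy as np

sr = 100                                  # envelope frame rate (frames/s)
duration, bpm = 30, 120
env = np.zeros(duration * sr)
period = int(round(sr * 60 / bpm))        # frames between beats
env[::period] = 1.0                       # impulse train at the tactus
env += 0.05 * np.random.default_rng(1).standard_normal(env.size)

# Autocorrelation peaks at integer multiples of the beat period.
ac = np.correlate(env, env, mode="full")[env.size - 1:]
lo, hi = int(sr * 60 / 200), int(sr * 60 / 40)   # 40-200 BPM search range
lag = lo + int(np.argmax(ac[lo:hi]))
print(f"estimated tempo ~ {60 * sr / lag:.1f} BPM")
```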

Keywords: computer science; speech recognition; musical notation; Hilbert–Huang transform; rhythm; perception; media technology; music information retrieval; beat; music
Published in: Computer Music Journal

Exploring Frequency-Dependent Brain Networks from Ongoing EEG Using Spatial ICA During Music Listening

2020

Recently, exploring brain activity through functional networks during naturalistic stimuli, especially music and video, has become an attractive challenge because of the low signal-to-noise ratio of the collected brain data. Although most efforts to explore the listening brain have been made through functional magnetic resonance imaging (fMRI) or sensor-level electro- or magnetoencephalography (EEG/MEG), little is known about how neural rhythms are involved in brain network activity under naturalistic stimuli. This study examined cortical oscillations through analysis of ongoing EEG and musical features during free listening to music. We used a data-driven method that co…
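
In outline, the analysis combines frequency-band decomposition with ICA. The sketch below shows that general shape (alpha-band filtering followed by FastICA) on synthetic data; it should not be read as the study's actual spatial-ICA pipeline.

```python
# Hedged sketch: band-pass filter EEG into one frequency band, then run
# ICA to obtain spatial components whose time courses can be related to
# musical features. The EEG here is random placeholder data.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
sr, n_channels, n_samples = 250, 32, 250 * 60
eeg = rng.standard_normal((n_samples, n_channels))   # placeholder EEG

# Alpha band (8-13 Hz), one of the rhythms the study examines.
b, a = butter(4, [8 / (sr / 2), 13 / (sr / 2)], btype="band")
alpha = filtfilt(b, a, eeg, axis=0)

# ICA: mixing-matrix columns are spatial maps (candidate networks),
# sources are their time courses.
ica = FastICA(n_components=10, random_state=0, max_iter=1000)
sources = ica.fit_transform(alpha)        # (samples, components)
spatial_maps = ica.mixing_                # (channels, components)
print(sources.shape, spatial_maps.shape)
```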

Keywords: cognitive science; brain activity; speech recognition; independent components analysis; electroencephalography (EEG); magnetoencephalography; superior temporal gyrus; cerebral cortex; music information retrieval; brain mapping; frequency-specific networks; oscillations; alpha band; attention; emotions; timbre; perception; auditory perception; functional magnetic resonance imaging; music; listening
Published in: Brain Topography

Exploring relationships between audio features and emotion in music

2009

In this paper, we present an analysis of the associations between emotion categories and audio features automatically extracted from raw audio data. This work is based on 110 excerpts from film soundtracks evaluated by 116 listeners. The data are annotated with 5 basic emotions (fear, anger, happiness, sadness, tenderness) on a 7-point scale. Exploiting state-of-the-art Music Information Retrieval (MIR) techniques, we extract audio features of different kinds: timbral, rhythmic, and tonal. Among others, we compute estimates of dissonance, mode, onset rate, and loudness. We study statistical relations between audio descriptors and emotion categories, confirming results from psychological …
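
The statistical core, correlating audio descriptors with emotion ratings, is easy to sketch. The toy arrays below stand in for the paper's 110 excerpts and listener ratings.

```python
# Sketch of the statistical step: Pearson correlations between audio
# descriptors and mean emotion ratings. Toy data, not the study's.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
n_excerpts = 110
features = {
    "dissonance": rng.random(n_excerpts),
    "onset_rate": rng.random(n_excerpts),
    "loudness": rng.random(n_excerpts),
}
# Mean rating per excerpt on a 7-point scale (toy values).
sadness = 1 + 6 * rng.random(n_excerpts)

for name, values in features.items():
    r, p = pearsonr(values, sadness)
    print(f"{name:>10} vs sadness: r = {r:+.2f} (p = {p:.3f})")
```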

Keywords: emotion classification; speech recognition; anger; happiness; sadness; loudness; mode (music); behavioral neuroscience; music information retrieval; psychology
Published in: Frontiers in Human Neuroscience

Genre-adaptive Semantic Computing and Audio-based Modelling for Music Mood Annotation

2016

This study investigates whether taking genre into account is beneficial for automatic music mood annotation in terms of the core affects valence, arousal, and tension, as well as several other mood scales. Novel techniques employing genre-adaptive semantic computing and audio-based modelling are proposed. A technique called ACTwg employs genre-adaptive semantic computing of mood-related social tags, whereas ACTwg-SLPwg combines semantic computing and audio-based modelling, both in a genre-adaptive manner. The proposed techniques are experimentally evaluated at predicting listener ratings related to a set of 600 popular music tracks spanning multiple genres. The results show that ACTwg outpe…
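
The abstract does not spell out the ACTwg computation, so the sketch below only illustrates the general genre-adaptive idea: weighting per-genre mood predictions by a track's soft genre membership. All names and numbers are hypothetical.

```python
# Illustrative guess at the genre-adaptive idea, not the ACTwg spec:
# combine genre-specific mood predictions with weights given by the
# track's soft genre membership (e.g., derived from tags).
import numpy as np

genres = ["rock", "jazz", "electronic"]
# Valence predicted for one track by each genre-specific model (toy).
per_genre_valence = np.array([0.2, 0.7, 0.4])
# Soft genre membership of the track, summing to 1 (toy).
membership = np.array([0.6, 0.1, 0.3])

valence = float(membership @ per_genre_valence)
print(f"genre-adaptive valence estimate: {valence:.2f}")
```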

Keywords: music information retrieval; genre-adaptive; annotation; popular music; semantic computing; social tags; music genre; mood prediction; music mood; valence (psychology); human-computer interaction; artificial intelligence; natural language processing; psychology

Semantic Computing of Moods Based on Tags in Social Media of Music

2014

Social tags inherent in online music services such as Last.fm provide a rich source of information on musical moods. The abundance of social tags makes this data highly beneficial for developing techniques to manage and retrieve mood information, and enables study of the relationships between music content and mood representations with data substantially larger than that available for conventional emotion research. However, no systematic assessment has been done on the accuracy of social tags and derived semantic models at capturing mood information in music. We propose a novel technique called Affective Circumplex Transformation (ACT) for representing the moods of music tracks in an interp…
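
One plausible reading of the ACT idea, aligning a latent mood-tag space with reference valence-arousal positions, can be sketched as an orthogonal Procrustes rotation. The coordinates below are invented for illustration and are not the paper's model.

```python
# Hedged sketch: rotate a latent mood-tag space onto reference
# valence-arousal (circumplex) positions via orthogonal Procrustes.
import numpy as np
from scipy.linalg import orthogonal_procrustes

# 2-D latent coordinates of mood tags from semantic analysis (toy).
latent = np.array([
    [0.9, 0.1],    # "happy"
    [-0.8, 0.2],   # "sad"
    [0.3, 0.9],    # "angry"
    [0.2, -0.9],   # "calm"
])
# Reference circumplex positions (valence, arousal) for the same tags.
reference = np.array([
    [0.9, 0.3],
    [-0.9, -0.3],
    [-0.5, 0.9],
    [0.5, -0.9],
])

R, _ = orthogonal_procrustes(latent, reference)
aligned = latent @ R       # tags now live in valence-arousal coordinates
print(np.round(aligned, 2))
```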

Keywords: computer science; vocabulary; music information retrieval; semantic analysis; semantics; semantic computing; affective computing; moods; affect (psychology); probabilistic latent semantic analysis; social tags; web mining; vector space model; genres; music; natural language processing; prediction; information retrieval (cs.IR); social and information networks (cs.SI); multimedia (cs.MM); artificial intelligence
Published in: IEEE Transactions on Knowledge and Data Engineering

The chronnectome of musical beat

2020

Keeping time is fundamental to our everyday existence. Various isochronous activities, such as locomotion, require us to use internal timekeeping. This phenomenon also comes into play in other human pursuits, such as dance and music. When listening to music, we spontaneously perceive and predict its beat. The process of beat perception comprises both beat inference and beat maintenance, and their relative importance depends on the salience of the beat in the music. To study functional connectivity associated with these processes in a naturalistic situation, we used functional magnetic resonance imaging to measure brain responses of participants while they were listening to a piece of music contai…
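
Chronnectome-style analyses rest on dynamic functional connectivity. A minimal sketch, assuming sliding-window correlations over synthetic regional time courses, follows; the window length and summary statistic are illustrative choices, not the study's.

```python
# Minimal sketch of dynamic ("chronnectome"-style) connectivity:
# sliding-window correlations between regional fMRI time courses.
# Synthetic signals replace the real brain data.
import numpy as np

rng = np.random.default_rng(4)
n_vols, n_regions = 300, 6
ts = rng.standard_normal((n_vols, n_regions))     # region time courses

win, step = 40, 10                                # in volumes (TRs)
starts = range(0, n_vols - win + 1, step)
dyn_fc = np.stack([np.corrcoef(ts[s:s + win].T) for s in starts])

# One simple summary: how much each connection fluctuates over time.
variability = dyn_fc.std(axis=0)
print("dynamic FC tensor:", dyn_fc.shape)     # (windows, regions, regions)
print("most variable pair:",
      np.unravel_index(np.argmax(np.triu(variability, k=1)),
                       variability.shape))
```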

Keywords: periodicity; inference; music psychology; music information retrieval; cerebellum; motor cortex; auditory cortex; default mode network; fMRI; functional magnetic resonance imaging; magnetic resonance imaging; beat; rhythm; auditory perception; acoustic stimulation; naturalistic imaging; dynamic connectivity; connectome; centrality; cognitive neuroscience; cognitive psychology; music
Published in: NeuroImage

2021

Background and objectives: Music has a unique capacity to evoke both strong emotions and vivid autobiographical memories. Previous music information retrieval (MIR) studies have shown that the emotional experience of music is influenced by a combination of musical features, including tonal, rhythmic, and loudness features. Here, our aim was to explore the relationship between music-evoked emotions and music-evoked memories and how musical features (derived with MIR) can predict them both. Methods: Healthy older adults (N = 113, age ≥ 60 years) participated in a listening task in which they rated a total of 140 song excerpts, comprising folk songs and popular songs from the 1950s to the 1980s, on five …
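
The prediction step described, regressing listener ratings on MIR-derived features, can be sketched with a cross-validated linear model on toy data. The feature names follow the abstract's tonal/rhythmic/loudness grouping, not the paper's exact feature set.

```python
# Sketch: regress a listener rating on MIR-derived features and report
# cross-validated fit. Toy data with 140 "excerpts", as in the abstract.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n_excerpts = 140
X = rng.random((n_excerpts, 3))   # columns: [mode, pulse_clarity, loudness]
valence = 2 * X[:, 0] - X[:, 2] + 0.3 * rng.standard_normal(n_excerpts)

model = LinearRegression()
scores = cross_val_score(model, X, valence, cv=10, scoring="r2")
print(f"cross-validated R^2 ~ {scores.mean():.2f}")
```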

Keywords: music psychology; autobiographical memory; salience; pulse (music); popular music; music information retrieval; valence (psychology); cognitive psychology; psychology
Published in: PLOS ONE